As the “AI for Good” global summit kicks off in Geneva, a new UNESCO study unveiled Tuesday highlights a critical challenge facing the rapidly expanding field of artificial intelligence: its colossal energy consumption.
The report suggests that a seemingly simple solution — asking shorter questions — could dramatically curb AI’s environmental footprint.
According to UNESCO, a combination of more concise queries and the use of specialised AI models has the potential to reduce AI energy consumption by up to 90% without compromising performance.
This revelation comes as concerns grow over the power demands of popular generative AI applications.
It also follows OpenAI CEO Sam Altman’s recent disclosure that each request sent to ChatGPT, the company’s widely used generative AI app, consumes an average of 0.34 Wh of electricity, between 10 and 70 times as much as a Google search.
With ChatGPT receiving around a billion requests per day, that amounts to 310 GWh annually, equivalent to the annual electricity consumption of three million people in Ethiopia, for example.
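As a rough sanity check, an annual total can be derived directly from a per-request figure. The sketch below uses the 0.34 Wh number cited above; the request volume and the 365-day year are assumptions, and the resulting total scales linearly with both:

```python
# Back-of-the-envelope annual electricity estimate for a chat service.
# Only the 0.34 Wh/request figure comes from the article; the daily
# request volume passed in is an assumption.

def annual_energy_gwh(wh_per_request: float, requests_per_day: float) -> float:
    """Annual electricity use in gigawatt-hours (1 GWh = 1e9 Wh)."""
    wh_per_year = wh_per_request * requests_per_day * 365
    return wh_per_year / 1e9

# One billion requests per day at 0.34 Wh each:
print(annual_energy_gwh(0.34, 1e9))
```

Because the relationship is linear, doubling the request volume doubles the annual total, which is why small per-request savings compound at scale.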
Moreover, UNESCO calculated that AI energy demand is doubling every 100 days as generative AI tools become embedded in everyday life.
“The exponential growth in computational power needed to run these models is placing increasing strain on global energy systems, water resources, and critical minerals, raising concerns about environmental sustainability, equitable access, and competition over limited resources,” the UNESCO report warned.
UNESCO’s researchers, however, were able to achieve a nearly 90% reduction in electricity usage by shortening the query, or prompt, and by using a smaller AI model, without a drop in performance.
Many AI models like ChatGPT are general-purpose models designed to respond to a wide variety of topics, meaning that they must sift through an immense volume of information to formulate and evaluate responses.
The use of smaller, specialised AI models offers major reductions in electricity needed to produce a response.
So did cutting prompts from 300 to 150 words.
Already aware of the energy issue, the tech giants now all offer miniature versions of their respective large language models, with fewer parameters.
For example, Google offers Gemma, Microsoft has Phi-3, and OpenAI has GPT-4o mini. French AI companies have done likewise; Mistral AI, for instance, has introduced its Ministral model.